About The Role
We are seeking a Data Engineer to build and maintain scalable data pipelines using Snowflake, dbt, and related data transformation tools. You will extract, transform, and load (ETL) data from Microsoft SQL Server (MSSQL) into our Snowflake data warehouse, ensuring clean, reliable data for analysis and reporting.
Responsibilities
- Design, develop, and maintain data pipelines using Snowflake, dbt, or Coalesce.
- Write and optimize SQL queries within dbt for efficient data transformation in Snowflake.
- Utilize Coalesce or similar data transformation tools to migrate data from MSSQL to Snowflake.
- Cleanse, validate, and transform data to ensure accuracy and consistency.
- Design and implement data models in dbt to transform raw data into analysis-ready datasets.
- Automate data pipelines and ensure their reliability and scalability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and translate them into technical solutions.
- Develop and implement unit tests for data pipelines and data models.
- Monitor data pipelines and troubleshoot issues as they arise.
- Document data pipelines and models for future reference.
Qualifications
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- 2+ years of experience as a Data Engineer or in a similar role.
- Proven experience with Snowflake, including writing and optimizing SQL queries and using its core features.
- Expertise in dbt, including model development, testing, and deployment.
- Experience with data transformation tools like dbt, Coalesce, or Hevo Data.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL/ELT processes.
- Excellent SQL programming skills (T-SQL and Snowflake SQL).
- Experience with version control systems (e.g., Git).
- Familiarity with data quality concepts and techniques.
- Excellent communication and collaboration skills.
Bonus Points
- Experience with cloud platforms (e.g., AWS, Azure).
- Experience with data orchestration tools (e.g., Airflow, Luigi).
- Experience with data governance and security practices.